Gas turbine


Despite Protests, Elon Musk Secures Air Permit for xAI

WIRED

A local health department in Memphis has granted Elon Musk's xAI data center an air permit to continue operating the gas turbines that power the company's Grok chatbot. The permit comes amid widespread community opposition and a looming lawsuit alleging the company violated the Clean Air Act. The Shelby County Health Department released its air permit for the xAI project Wednesday, after receiving hundreds of public comments. The news was first reported by the Daily Memphian. In June, the Memphis Chamber of Commerce announced that xAI had chosen a site in Memphis to build its new supercomputer.


Elon Musk's poison-spewing prized possession faces shutdown that could reshape America

Daily Mail - Science & tech

Elon Musk quickly began pulling the plug on federal projects amid an escalating feud with Donald Trump, but the high-stakes clash now further threatens the most prized asset in his empire. With Tesla shares in freefall and SpaceX contracts on the line, the ambitious megaproject Musk most needs to compete in the AI race could become collateral in the explosive back-and-forth. Built in Tennessee, the supercomputer Colossus powers Musk's artificial intelligence company, xAI. The vast facility cost an estimated $4 billion and Musk plans to spend tens of billions more expanding it in a bid to challenge AI giants OpenAI and Google. However, it is already mired in an explosive backlash that the president could seize on.


Elon Musk's xAI accused of pollution over Memphis supercomputer

The Guardian

Elon Musk's artificial intelligence company is stirring controversy in Memphis, Tennessee. That's where he's building a massive supercomputer to power his company xAI. Community residents and environmental activists say that since the supercomputer was fired up last summer it has become one of the biggest air polluters in the county. But some local officials have championed the billionaire, saying he's investing in Memphis. The first public hearing with the health department is scheduled for Friday, where county officials will hear from all sides of the debate.


ExplainReduce: Summarising local explanations via proxies

Seppäläinen, Lauri, Guo, Mudong, Puolamäki, Kai

arXiv.org Artificial Intelligence

Most commonly used non-linear machine learning methods are closed-box models, uninterpretable to humans. The field of explainable artificial intelligence (XAI) aims to develop tools to examine the inner workings of these closed boxes. An often-used model-agnostic approach to XAI involves using simple models as local approximations to produce so-called local explanations; examples of this approach include LIME, SHAP, and SLISEMAP. This paper shows how a large set of local explanations can be reduced to a small "proxy set" of simple models, which can act as a generative global explanation. This reduction procedure, ExplainReduce, can be formulated as an optimisation problem and approximated efficiently using greedy heuristics.
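The greedy heuristic the abstract mentions can be sketched as follows. This is a hypothetical interface, not the authors' implementation: it assumes the losses of each candidate local explanation on each instance have already been collected into a matrix, and greedily adds the model that most reduces the total loss when every instance is assigned to its best proxy so far.

```python
import numpy as np

def greedy_proxy_selection(loss_matrix, k):
    """Greedily select k proxy explanations.

    loss_matrix[i, j] holds the loss of local explanation j applied to
    instance i. Each greedy step adds the candidate that most reduces
    the total loss under best-proxy-per-instance assignment.
    """
    n_instances, n_models = loss_matrix.shape
    selected = []
    best_loss = np.full(n_instances, np.inf)  # best proxy loss per instance
    for _ in range(min(k, n_models)):
        best_j, best_total = None, np.inf
        for j in range(n_models):
            if j in selected:
                continue
            total = np.minimum(best_loss, loss_matrix[:, j]).sum()
            if total < best_total:
                best_j, best_total = j, total
        selected.append(best_j)
        best_loss = np.minimum(best_loss, loss_matrix[:, best_j])
    return selected, best_loss
```

The greedy choice is the standard approximation for this kind of coverage-style objective: each step is cheap, and the small proxy set it returns can then stand in for the full collection of local explanations.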


Tabular Machine Learning Methods for Predicting Gas Turbine Emissions

Potts, Rebecca, Hackney, Rick, Leontidis, Georgios

arXiv.org Artificial Intelligence

Predicting emissions for gas turbines is critical for monitoring harmful pollutants being released into the atmosphere. In this study, we evaluate the performance of machine learning models for predicting emissions for gas turbines. We compare an existing predictive emissions model, a first principles-based Chemical Kinetics model, against two machine learning models we developed based on SAINT and XGBoost, to demonstrate improved predictive performance of nitrogen oxides (NOx) and carbon monoxide (CO) using machine learning techniques. Our analysis utilises a Siemens Energy gas turbine test bed tabular dataset to train and validate the machine learning models. Additionally, we explore the trade-off between incorporating more features to enhance the model complexity, and the resulting presence of increased missing values in the dataset.


In-flight Novelty Detection with Convolutional Neural Networks

Hartwell, Adam, Montana, Felipe, Jacobs, Will, Kadirkamanathan, Visakan, Mills, Andrew R, Clark, Tom

arXiv.org Artificial Intelligence

Gas turbine engines are complex machines that typically generate a vast amount of data, and require careful monitoring to allow for cost-effective preventative maintenance. In aerospace applications, returning all measured data to ground is prohibitively expensive, often causing useful, high value, data to be discarded. The ability to detect, prioritise, and return useful data in real-time is therefore vital. This paper proposes that system output measurements, described by a convolutional neural network model of normality, are prioritised in real-time for the attention of preventative maintenance decision makers. Due to the complexity of gas turbine engine time-varying behaviours, deriving accurate physical models is difficult, and often leads to models with low prediction accuracy and incompatibility with real-time execution. Data-driven modelling is a desirable alternative producing high accuracy, asset specific models without the need for derivation from first principles. We present a data-driven system for online detection and prioritisation of anomalous data. Biased data assessment deriving from novel operating conditions is avoided by uncertainty management integrated into the deep neural predictive model. Testing is performed on real and synthetic data, showing sensitivity to both real and synthetic faults. The system is capable of running in real-time on low-power embedded hardware and is currently in deployment on the Rolls-Royce Pearl 15 engine flight trials.
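The core prioritisation idea, scoring each measurement against a data-driven model of normal behaviour and returning only the anomalous ones, can be illustrated without the paper's CNN. The sketch below is a deliberate simplification: the "model of normality" is just a rolling mean/std over recent samples rather than a convolutional network, and the fault is a hand-injected spike.

```python
import numpy as np

def novelty_scores(signal, window=20, z_thresh=5.0):
    """Score samples against a rolling model of 'normal' behaviour.

    Stand-in for a learned model of normality: the predictor is a
    rolling mean/std over the previous `window` samples, and the
    novelty score is the resulting z-statistic. Samples whose score
    exceeds z_thresh would be prioritised for transmission to ground.
    """
    scores = np.zeros(len(signal))
    for i in range(window, len(signal)):
        ref = signal[i - window:i]
        mu, sigma = ref.mean(), ref.std() + 1e-9
        scores[i] = abs(signal[i] - mu) / sigma
    return scores, scores > z_thresh

rng = np.random.default_rng(1)
healthy = 0.05 * rng.normal(size=500)   # healthy sensor channel
faulty = healthy.copy()
faulty[300] += 1.0                      # injected spike fault
scores, flags = novelty_scores(faulty)
```

Only flagged samples would be queued for downlink, which is the bandwidth-saving behaviour the paper targets; the CNN and its uncertainty management replace the crude rolling statistics in the real system.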



Act Now to Prevent Regulatory Derailment of the AI Boom - RTInsights

#artificialintelligence

If businesses do not adopt their own best practices in addressing AI transparency and bias issues, they may not have a choice in the future. Continuous intelligence (CI) relies on the use of artificial intelligence (AI) and machine learning (ML) to derive actionable information in milliseconds to minutes from streaming data. Adoption is booming, but one obstacle could derail industry efforts if not addressed immediately. That obstacle is regulatory oversight and interference. The looming problem relates to the way AI and ML are used in CI applications today.


AI-Inspired Project Advances Condition-Based Maintenance and Prediction for Turbines

#artificialintelligence

An NETL-sponsored project is leveraging artificial intelligence in a manner that will lead to more efficient, long-lasting and reliable gas turbines to meet America's growing energy needs. As advanced energy systems move toward higher temperatures to boost efficiency and reduce emissions, monitoring their performance under such harsh conditions becomes a challenge. Existing monitoring tools for gas turbines are costly, time-consuming and complicated, involving wires and a risk of damage. With funding and guidance from NETL, North Carolina-based Siemens Corp. and its partners are developing smart sensor systems that provide real-time monitoring of gas turbine components, thereby enabling condition-based maintenance and prediction of each component's remaining useful life. Siemens is achieving this by incorporating advanced sensors directly onto gas turbine blades that feature an ultra-high-temperature wireless monitoring and data transfer capability that provides accurate, real-time information up to 1,400 degrees Celsius.


Is data enough to win in AI?

#artificialintelligence

The common narrative in the tech industry is that those companies with access to the most data are best positioned to win with machine learning. The apparent logic of this positive feedback loop is simple: more/better data → better models → more accurate predictions → better product or service → more customers → more data, etc. In his December 2018 blog post, Benedict Evans argues that this is too simplistic. It is critical to consider what data, for what problems, for which use cases? Google, Facebook, or China will not "just win" because of some combination of existing data assets, scale, and permissiveness around user data.